Instance-optimality in Probability with an ℓ₁-Minimization Decoder

Authors

  • Ronald DeVore
  • Guergana Petrova
  • Przemyslaw Wojtaszczyk
Abstract

Let Φ(ω), ω ∈ Ω, be a family of n × N random matrices whose entries φ_{i,j} are independent realizations of a symmetric real random variable η with expectation 𝔼η = 0 and variance 𝔼η² = 1/n. Such matrices are used in compressed sensing to encode a vector x ∈ ℝ^N by y = Φx. The information y holds about x is extracted by a decoder ∆ : ℝ^n → ℝ^N. The most prominent decoder is the ℓ₁-minimization decoder ∆, which for a given y ∈ ℝ^n returns the element ∆(y) ∈ ℝ^N of minimal ℓ₁-norm among all z ∈ ℝ^N with Φz = y. This paper is interested in properties of the random family Φ(ω) which guarantee that the vector x̄ := ∆(Φx) will, with high probability, approximate x in ℓ₂ to an accuracy comparable with the best k-term error of approximation in ℓ₂ for the range k ≤ an/log₂(N/n). This means that for the above range of k, for each signal x ∈ ℝ^N, the vector x̄ := ∆(Φx) satisfies

‖x − x̄‖_{ℓ₂^N} ≤ C inf_{z ∈ Σ_k} ‖x − z‖_{ℓ₂^N}

with high probability on the draw of Φ. Here, Σ_k consists of all vectors with at most k nonzero coordinates. The first result of this type was proved by Wojtaszczyk [19], who showed this property when η is a normalized Gaussian random variable. We extend this property to more general random variables, including the particular case where η is the Bernoulli random variable which takes the values ±1/√n with equal probability. The proofs of our results use geometric mapping properties of such random matrices, some of which were recently obtained in [14].
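As a concrete illustration of the ℓ₁-minimization decoder ∆ described above (a sketch, not the authors' code), the minimization can be cast as a linear program by splitting z = u − v with u, v ≥ 0, so that ‖z‖₁ = Σᵢ(uᵢ + vᵢ). The problem sizes and the random draw below are illustrative assumptions; the encoder uses the Bernoulli ±1/√n entries mentioned in the abstract.

```python
import numpy as np
from scipy.optimize import linprog

def l1_decode(Phi, y):
    """l1-minimization decoder: argmin ||z||_1 subject to Phi z = y.

    Cast as a linear program by splitting z = u - v with u, v >= 0,
    so ||z||_1 = sum(u + v) and the constraint becomes Phi u - Phi v = y.
    """
    n, N = Phi.shape
    c = np.ones(2 * N)                      # objective: sum(u) + sum(v)
    A_eq = np.hstack([Phi, -Phi])           # equality constraint Phi(u - v) = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    uv = res.x
    return uv[:N] - uv[N:]

# Illustrative draw: Bernoulli +-1/sqrt(n) entries, as in the abstract
rng = np.random.default_rng(0)
n, N, k = 40, 100, 4
Phi = rng.choice([-1.0, 1.0], size=(n, N)) / np.sqrt(n)
x = np.zeros(N)
x[rng.choice(N, size=k, replace=False)] = rng.standard_normal(k)
x_bar = l1_decode(Phi, Phi @ x)             # decode from the measurements y = Phi x
```

For a k-sparse x in this regime the decoder recovers x up to solver tolerance, with high probability on the draw of Φ; for a general x the abstract's inequality bounds the error by the best k-term approximation error.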


Related articles

Sparse Recovery by Non-Convex Optimization –

In this note, we address the theoretical properties of ∆p, a class of compressed sensing decoders that rely on ℓp minimization with p ∈ (0, 1) to recover estimates of sparse and compressible signals from incomplete and inaccurate measurements. In particular, we extend the results of Candès, Romberg and Tao [3] and Wojtaszczyk [30] regarding the decoder ∆₁, based on ℓ₁ minimization, to ∆p wi...


Sparse Recovery by Non-Convex Optimization – Instance Optimality

In this note, we address the theoretical properties of ∆p, a class of compressed sensing decoders that rely on ℓp minimization with 0 < p < 1 to recover estimates of sparse and compressible signals from incomplete and inaccurate measurements. In particular, we extend the results of Candès, Romberg and Tao [4] and Wojtaszczyk [30] regarding the decoder ∆₁, based on ℓ₁ minimization, to ∆p with 0...
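One standard way to approximate the non-convex ℓp minimizer behind such ∆p decoders is iteratively reweighted least squares (IRLS). The sketch below is my illustration, not the cited note's algorithm: it assumes noiseless measurements Φz = y and solves a smoothed weighted minimum-norm problem at each step, shrinking the smoothing parameter between rounds.

```python
import numpy as np

def irls_lp_decode(Phi, y, p=0.5, outer=8, inner=10):
    """Approximate argmin ||z||_p^p subject to Phi z = y via IRLS.

    Each inner step solves min sum_i w_i z_i^2 s.t. Phi z = y, whose
    closed form is z = W Phi^T (Phi W Phi^T)^{-1} y with the diagonal
    W_ii = (z_i^2 + eps)^{1 - p/2}; eps is reduced between outer rounds
    so the weights increasingly favor sparse solutions.
    """
    x = np.linalg.pinv(Phi) @ y             # minimum-l2-norm initialization
    eps = 1.0
    for _ in range(outer):
        for _ in range(inner):
            W = (x ** 2 + eps) ** (1 - p / 2)    # diagonal of W
            G = (Phi * W) @ Phi.T                # Phi W Phi^T  (n x n)
            x = W * (Phi.T @ np.linalg.solve(G, y))
        eps /= 10.0                         # tighten the smoothing
    return x

# Illustrative parameters (assumptions, not from the note)
rng = np.random.default_rng(2)
n, N, k = 30, 80, 3
Phi = rng.standard_normal((n, N)) / np.sqrt(n)
x = np.zeros(N)
x[rng.choice(N, size=k, replace=False)] = rng.standard_normal(k)
x_hat = irls_lp_decode(Phi, Phi @ x)
```

The continuation strategy on eps is the usual practical device: with eps fixed, IRLS can stall at a non-sparse stationary point, while a gradually tightened smoothing tends to track the sparse solution.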


A remark about orthogonal matching pursuit algorithm

In this note, we investigate the theoretical properties of Orthogonal Matching Pursuit (OMP), a class of decoders to recover sparse signals in compressed sensing. In particular, we show that the OMP decoder can give (p, q) instance optimality for a large class of encoders with 1 ≤ p ≤ q ≤ 2 and (p, q) ≠ (2, 2). We also show that, if the encoding matrix is drawn from an appropriate distribution, ...
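A minimal sketch of the OMP decoder discussed above (the Gaussian encoder and problem sizes are illustrative assumptions, not from the note):

```python
import numpy as np

def omp_decode(Phi, y, k):
    """Orthogonal Matching Pursuit: greedily build a k-term support.

    Each step selects the column of Phi most correlated with the current
    residual, then refits y by least squares on all selected columns.
    """
    n, N = Phi.shape
    support = []
    residual = y.copy()
    coef = np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(Phi.T @ residual)))   # best-matching column
        support.append(j)
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(N)
    x_hat[support] = coef
    return x_hat

# Illustrative draw: i.i.d. N(0, 1/n) entries
rng = np.random.default_rng(1)
n, N, k = 40, 100, 4
Phi = rng.standard_normal((n, N)) / np.sqrt(n)
x = np.zeros(N)
x[rng.choice(N, size=k, replace=False)] = rng.standard_normal(k)
x_hat = omp_decode(Phi, Phi @ x, k)
```

The least-squares refit at every step is what distinguishes OMP from plain matching pursuit: once a column is selected, the residual is orthogonal to the span of all selected columns.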


Stable Signal Recovery from Phaseless Measurements

The aim of this paper is to study the stability of ℓ₁ minimization for compressive phase retrieval and to extend instance-optimality in compressed sensing to the real phase retrieval setting. We first show that m = O(k log(N/k)) measurements are enough to guarantee that ℓ₁ minimization recovers k-sparse signals stably, provided the measurement matrix A satisfies the strong RIP pr...


Generalized Null Space and Restricted Isometry Properties

We propose a theoretical study of the conditions guaranteeing that a decoder will obtain an optimal signal recovery from an underdetermined set of linear measurements. This special type of performance guarantee is termed instance optimality and is typically related to certain properties of the dimensionality-reducing matrix M. Our work extends traditional results in sparse recovery, where ins...




Publication date: 2008